Parsing Natural Scenes and Natural Language with Recursive Neural Networks
Authors
Abstract
Recursive structure is commonly found in the inputs of different modalities, such as natural scene images or natural language sentences. Discovering this recursive structure helps us not only to identify the units that an image or sentence contains but also to understand how they interact to form a whole. We introduce a max-margin structure prediction architecture based on recursive neural networks that can successfully recover such structure both in complex scene images and in sentences. The same algorithm can be used both to provide a competitive syntactic parser for natural language sentences from the Penn Treebank and to outperform alternative approaches for semantic scene segmentation, annotation and classification. For segmentation and annotation, our algorithm obtains a new level of state-of-the-art performance on the Stanford background dataset (78.1%). The features from the image parse tree outperform Gist descriptors for scene classification by 4%.
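The core operation the abstract describes is a recursive neural network that repeatedly merges two child representations into a parent vector and scores each merge, building a parse tree bottom-up. The sketch below illustrates that composition step with a greedy merge loop; the parameter names (`W`, `b`, `w_score`), the dimensionality, and the greedy strategy are illustrative assumptions, not the paper's exact max-margin training or search procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # dimensionality of node representations (illustrative choice)

# Hypothetical parameters: W maps two concatenated child vectors to a parent
# vector; w_score assigns each candidate merge a plausibility score.
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)
w_score = rng.standard_normal(d) * 0.1

def compose(c1, c2):
    """Merge two child vectors into a parent vector and score the merge."""
    parent = np.tanh(W @ np.concatenate([c1, c2]) + b)
    return parent, float(w_score @ parent)

def greedy_parse(leaves):
    """Greedily merge the highest-scoring adjacent pair until one root remains."""
    nodes = list(leaves)
    total = 0.0
    while len(nodes) > 1:
        candidates = [compose(nodes[i], nodes[i + 1]) for i in range(len(nodes) - 1)]
        i = max(range(len(candidates)), key=lambda j: candidates[j][1])
        parent, s = candidates[i]
        total += s
        nodes[i:i + 2] = [parent]  # replace the pair with its parent node
    return nodes[0], total

# Toy input: five random "leaf" vectors standing in for word or segment features.
root, score = greedy_parse([rng.standard_normal(d) for _ in range(5)])
print("root shape:", root.shape)
```

In the paper's setting the same composition function is applied to both image segments and words, and the parameters are trained with a max-margin objective so that correct trees score higher than incorrect ones; here the parameters are random and only the mechanics of recursive composition are shown.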
Related Papers
Second Exam: Natural Language Parsing with Neural Networks
With the advent of “deep learning”, there has been a recent resurgence of interest in the use of artificial neural networks for machine learning. This paper presents an overview of recent research in the statistical parsing of natural language sentences using such neural networks as a learning model. Though it is a fairly new addition to the toolset in this area, important results have been rec...
Learning Continuous Phrase Representations and Syntactic Parsing with Recursive Neural Networks
Natural language parsing has typically been done with small sets of discrete categories such as NP and VP, but this representation does not capture the full syntactic or semantic richness of linguistic phrases, and attempts to improve on this by lexicalizing phrases only partly address the problem at the cost of huge feature spaces and sparseness. To address this, we introduce a recursive neur...
Extensions to Tree-Recursive Neural Networks for Natural Language Inference
Understanding textual entailment and contradiction is considered fundamental to natural language understanding. Tree-recursive neural networks, which exploit valuable syntactic parse information, achieve state-of-the-art accuracy among pure sentence encoding models for this task. In this course project for CS224D, we explore two extensions to tree-recursive neural networks: deep TreeLSTMs and at...
Neural Tree Indexers for Text Understanding
Recurrent neural networks (RNNs) process input text sequentially and model the conditional transition between word tokens. In contrast, the advantages of recursive networks include that they explicitly model the compositionality and the recursive structure of natural language. However, the current recursive architecture is limited by its dependence on a syntactic tree. In this paper, we introduce...
Bidirectional Recursive Neural Networks for Token-Level Labeling with Structure
Recently, deep architectures, such as recurrent and recursive neural networks have been successfully applied to various natural language processing tasks. Inspired by bidirectional recurrent neural networks which use representations that summarize the past and future around an instance, we propose a novel architecture that aims to capture the structural information around an input, and use it t...
Publication year: 2011